
    Transverse Meissner Physics of Planar Superconductors with Columnar Pins

    The statistical mechanics of thermally excited vortex lines with columnar defects can be mapped onto the physics of interacting quantum particles with quenched random disorder in one less dimension. The destruction of the Bose glass phase in Type II superconductors, when the external magnetic field is tilted sufficiently far from the column direction, is described by a poorly understood non-Hermitian quantum phase transition. We present here exact results for this transition in (1+1) dimensions, obtained by mapping the problem in the hard-core limit onto one-dimensional fermions described by a non-Hermitian tight-binding model. Both site randomness and the relatively unexplored case of bond randomness are considered. Analysis near the mobility edge and near the band center in the latter case is facilitated by a real-space renormalization group procedure used previously for Hermitian quantum problems with quenched randomness in one dimension. Comment: 23 pages, 22 figures.
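    The non-Hermitian tight-binding model with site randomness can be illustrated with a Hatano–Nelson-type Hamiltonian, the standard lattice model for tilted vortex lines pinned by columnar defects. This is a minimal sketch, not the paper's exact parametrization: the hopping asymmetry `g`, disorder distribution, and system size are illustrative assumptions. With periodic boundary conditions, eigenvalues whose localization length exceeds 1/g move into the complex plane, signaling vortex depinning.

```python
import numpy as np

rng = np.random.default_rng(0)
N, t, g = 200, 1.0, 0.5          # sites, hopping amplitude, tilt parameter
eps = rng.uniform(-1.0, 1.0, N)  # quenched random site energies

# Hatano-Nelson Hamiltonian with periodic boundary conditions:
# asymmetric hoppings t*exp(+g) (rightward) and t*exp(-g) (leftward).
H = np.diag(eps).astype(complex)
for n in range(N):
    H[n, (n + 1) % N] = -t * np.exp(g)
    H[(n + 1) % N, n] = -t * np.exp(-g)

ev = np.linalg.eigvals(H)
# Delocalized states acquire complex energies once g exceeds the inverse
# localization length; strongly localized states keep real energies.
n_complex = np.sum(np.abs(ev.imag) > 1e-8)
print(f"{n_complex} of {N} eigenvalues are complex")
```

For this disorder strength most band-center states delocalize at g = 0.5, so a finite fraction of the spectrum is complex; lowering g below the inverse localization length collapses the spectrum back onto the real axis.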

    Beyond-mean-field theory for the statistics of neural coordination

    Understanding the coordination structure of neurons in neuronal networks is essential for unraveling the distributed information processing mechanisms in brain networks. Recent advancements in measurement techniques have resulted in an increasing amount of data on neural activities recorded in parallel, revealing largely heterogeneous correlation patterns across neurons. Yet, the mechanistic origin of this heterogeneity is largely unknown because existing theoretical approaches linking structure and dynamics in neural circuits are mostly restricted to average connection patterns. Here we present a systematic inclusion of variability in network connectivity via tools from statistical physics of disordered systems. We study networks of spiking leaky integrate-and-fire neurons and employ mean-field and linear-response methods to map the spiking networks to linear rate models with an equivalent neuron-resolved correlation structure. The latter models can be formulated in a field-theoretic language that allows using disorder-average and replica techniques to systematically derive quantitatively matching beyond-mean-field predictions for the mean and variance of cross-covariances as functions of the average and variability of connection patterns. We show that heterogeneity in covariances is not a result of variability in single-neuron firing statistics but stems from the sparse realization and variable strength of connections, as ubiquitously observed in brain networks. Average correlations between neurons are found to be insensitive to the level of heterogeneity, which in contrast modulates the variability of covariances across many orders of magnitude, giving rise to an efficient tuning of the complexity of coordination patterns in neuronal circuits.
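    The linear-response step the abstract describes can be sketched numerically: draw a sparse random coupling matrix and compute the long-time (zero-frequency) covariance matrix of the linearized rate model, whose off-diagonal entries are the cross-covariances whose mean and variance the theory predicts. Network size, sparsity, and coupling strength below are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 300, 0.1                  # neurons, connection probability
g = 0.5                          # effective coupling (subcritical)

# Sparse random connectivity: Erdos-Renyi mask with Gaussian weights,
# scaled so the spectral radius is about g < 1 (stable linearization).
mask = rng.random((N, N)) < p
W = mask * rng.normal(0.0, g / np.sqrt(N * p), (N, N))
np.fill_diagonal(W, 0.0)

# Long-time covariance of the linear rate model x = W x + noise,
# with unit white-noise input: C = (1 - W)^(-1) (1 - W)^(-T).
B = np.linalg.inv(np.eye(N) - W)
C = B @ B.T

# Heterogeneity of coordination: spread of cross-covariances over pairs.
off = C[~np.eye(N, dtype=bool)]
print(f"mean cross-covariance {off.mean():.4f}, std {off.std():.4f}")
```

Repeating this over disorder realizations and sweeping the weight variability is the numerical experiment that the replica calculation predicts analytically: the mean of `off` barely moves while its spread changes strongly.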

    Hidden connectivity structures control collective network dynamics

    A common approach to model local neural circuits is to assume random connectivity. But how is our choice of randomness informed by known network properties? And how does it affect the network's behavior? Previous approaches have focused on prescribing increasingly sophisticated statistics of synaptic strengths and motifs. However, at the same time experimental data on parallel dynamics of neurons is readily accessible. We therefore propose a complementary approach, specifying connectivity in the space that directly controls the dynamics - the space of eigenmodes. We develop a theory for a novel ensemble of large random matrices, whose eigenvalue distribution can be chosen arbitrarily. We show analytically how varying such a distribution induces a diverse range of collective network behaviors, including power laws that characterize the dimensionality, principal components spectrum, autocorrelation, and autoresponse of neuronal activity. The power-law exponents are controlled by the density of nearly critical eigenvalues, and provide a minimal and robust measure to directly link observable dynamics and connectivity. The density of nearly critical modes also characterizes a transition from high- to low-dimensional dynamics, while their maximum oscillation frequency determines a transition from an exponential to a power-law decay in time of the correlation and response functions. We prove that the wide range of dynamical behaviors resulting from the proposed connectivity ensemble is caused by structures that are invisible to a motif analysis. Their presence is captured only by motifs that appear with vanishingly small probability as the number of neurons grows; only reciprocal motifs occur with finite probability. In other words, a motif analysis can be blind to synaptic structures controlling the dynamics, which instead become apparent in the space of eigenmode statistics.
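    A real connectivity matrix with an arbitrarily prescribed spectrum can be sketched as follows: sample complex-conjugate eigenvalue pairs from any chosen density, assemble them into real 2x2 rotation-scaling blocks, and hide the spectrum behind a random orthogonal change of basis. This is a minimal normal-matrix sketch under assumed parameters (disk radius 0.9, uniform density); the paper's ensemble is presumably more general, so treat this only as an illustration of prescribing eigenvalues directly.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200  # must be even: eigenvalues come in conjugate pairs

# Draw N/2 eigenvalues uniformly over a disk of radius 0.9 (any other
# density could be substituted here), plus their complex conjugates,
# so that the resulting connectivity matrix is real.
r = 0.9 * np.sqrt(rng.random(N // 2))
theta = np.pi * rng.random(N // 2)
lam = r * np.exp(1j * theta)

# Real block-diagonal form: each pair a +/- ib becomes [[a, b], [-b, a]].
D = np.zeros((N, N))
for k, l in enumerate(lam):
    D[2*k:2*k+2, 2*k:2*k+2] = [[l.real, l.imag], [-l.imag, l.real]]

# Random orthogonal change of basis preserves the spectrum exactly.
Q, _ = np.linalg.qr(rng.normal(size=(N, N)))
J = Q @ D @ Q.T

ev = np.linalg.eigvals(J)
print("spectral radius:", np.abs(ev).max())
```

Pushing probability mass of the eigenvalue density toward the unit circle is what creates the nearly critical modes that, per the abstract, control the power-law exponents of dimensionality and autocorrelation.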

    Integration of continuous-time dynamics in a spiking neural network simulator

    Contemporary modeling approaches to the dynamics of neural networks consider two main classes of models: biologically grounded spiking neurons and functionally inspired rate-based units. The unified simulation framework presented here supports the combination of the two for multi-scale modeling approaches, the quantitative validation of mean-field approaches by spiking network simulations, and an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most efficient spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. We further demonstrate the broad applicability of the framework by considering various examples from the literature ranging from random networks to neural field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
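    The core numerical task, delayed time-continuous interactions between rate units, can be sketched with an Euler–Maruyama integrator and a ring buffer of past activities, mirroring how a spiking simulator buffers delayed state alongside its event queues. All parameters (network size, time constant, delay, noise) are illustrative assumptions, and this is a plain fixed-step sketch, not the reference implementation described in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N, steps, dt = 100, 2000, 0.1    # neurons, integration steps, step (ms)
tau, d = 10.0, 1.0               # membrane time constant, delay (ms)
steps_delay = int(d / dt)

W = rng.normal(0.0, 0.5 / np.sqrt(N), (N, N))  # random coupling
phi = np.tanh                                  # rate nonlinearity

# Ring buffer holding the last `steps_delay` activity vectors; reading
# a slot before overwriting it yields the state from exactly d ms ago.
buf = np.zeros((steps_delay, N))
x = np.zeros(N)
for step in range(steps):
    delayed = buf[step % steps_delay].copy()
    # Euler-Maruyama step of tau dx = (-x + W phi(x(t - d))) dt + dW
    x = x + dt / tau * (-x + W @ phi(delayed)) \
        + np.sqrt(dt) / tau * rng.normal(0.0, 1.0, N)
    buf[step % steps_delay] = x

print("stationary activity std:", x.std())
```

Separating the buffer (communication infrastructure) from the update rule (rate dynamics) reflects the design principle the abstract highlights: new rate models only need to supply the update line.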

    Learning Interacting Theories from Data

    One challenge of physics is to explain how collective properties arise from microscopic interactions. Indeed, interactions form the building blocks of almost all physical theories and are described by polynomial terms in the action. The traditional approach is to derive these terms from elementary processes and then use the resulting model to make predictions for the entire system. But what if the underlying processes are unknown? Can we reverse the approach and learn the microscopic action by observing the entire system? We use invertible neural networks (INNs) to first learn the observed data distribution. By the choice of a suitable nonlinearity for the neuronal activation function, we are then able to compute the action from the weights of the trained model; a diagrammatic language expresses the change of the action from layer to layer. This process uncovers how the network hierarchically constructs interactions via nonlinear transformations of pairwise relations. We test this approach on simulated data sets of interacting theories. The network consistently reproduces a broad class of unimodal distributions; outside this class, it finds effective theories that approximate the data statistics up to the third cumulant. We explicitly show how network depth and data quantity jointly improve the agreement between the learned and the true model. This work shows how to leverage the power of machine learning to transparently extract microscopic models from data.
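    The change-of-variables mechanics underlying this approach can be shown with a single untrained affine coupling layer: an invertible map with a tractable Jacobian, through which the action S(x) = -log p(x) is pulled back from a Gaussian prior on the latent variable. The paper instead reads the action off the trained weights via a diagrammatic expansion; the layer form and weight scales here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)
dim = 4
half = dim // 2

# One RealNVP-style affine coupling layer: invertible by construction,
# with a triangular Jacobian so log|det J| is simply sum(s).
Ws = rng.normal(0.0, 0.1, (half, half))
Wt = rng.normal(0.0, 0.1, (half, half))

def forward(z):
    z1, z2 = z[:half], z[half:]
    s, t = np.tanh(Ws @ z1), Wt @ z1
    return np.concatenate([z1, z2 * np.exp(s) + t])

def inverse(x):
    # Returns the latent z and log|det J| of the inverse map.
    x1, x2 = x[:half], x[half:]
    s, t = np.tanh(Ws @ x1), Wt @ x1
    return np.concatenate([x1, (x2 - t) * np.exp(-s)]), -s.sum()

def action(x):
    # S(x) = -log p(x), pulled back through the flow from a
    # standard Gaussian prior on z.
    z, logdet = inverse(x)
    return 0.5 * z @ z + 0.5 * dim * np.log(2.0 * np.pi) - logdet

z0 = rng.normal(size=dim)
x = forward(z0)
print("action S(x) =", action(x))
```

In the paper's setting, expanding such layers in powers of x and tracking the terms layer by layer is what yields the polynomial interaction coefficients of the learned action.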

    Asymptotic Dynamics in Quantum Field Theory

    A crucial element of scattering theory and the LSZ reduction formula is the assumption that the coupling vanishes at large times. This is known not to hold for the theories of the Standard Model, and in general such asymptotic dynamics is not well understood. We give a description of asymptotic dynamics in field theories which incorporates the important features of weak convergence and physical boundary conditions. Applications to theories with three- and four-point interactions are presented, and the results are shown to be completely consistent with the results of perturbation theory. Comment: 18 pages, 3 figures.